Sparse-View CT Reconstruction Based on Nonconvex L1 − L2 Regularizations

Authors

  • Ali Pour Yazdanpanah
  • Farideh Foroozandeh Shahraki
  • Emma Regentova
Abstract

Reconstruction from sparse-view projections is one of the important problems in computed tomography (CT), where the number of projections is limited by what is available or feasible to acquire. Traditionally, convex regularizers have been used to improve reconstruction quality in sparse-view CT, since the convexity of the resulting problem makes the optimization tractable. However, convex regularizers often produce biased estimates and inaccurate reconstructions in CT problems. Here, we present a nonconvex, Lipschitz-continuous, and non-smooth regularization model. CT reconstruction is formulated as a nonconvex constrained L1 − L2 minimization problem and solved with a difference-of-convex-functions algorithm (DCA) combined with the alternating direction method of multipliers (ADMM), which yields better results than L0 or L1 regularizers for CT reconstruction. We compare our method with previously reported high-performance methods based on convex regularizers such as TV, wavelet, curvelet, and curvelet+TV (CTV) on test phantom images. The results show the benefits of using the nonconvex regularizer in sparse-view CT reconstruction.

Keywords: computed tomography, sparse-view reconstruction, L1 − L2 minimization, nonconvex optimization, difference of convex functions.
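To make the formulation concrete, below is a minimal LaTeX sketch of an L1 − L2 regularized reconstruction model and the corresponding difference-of-convex (DCA) step; the system matrix A, sinogram b, sparsifying transform D (e.g., a discrete gradient), and weight λ are assumptions for illustration, since the abstract does not state the exact model or constraints used in the paper.

% Sketch only, under the assumptions stated above; the paper's actual model may differ.
\[
  \min_{x \ge 0} \;\; \tfrac{1}{2}\,\|Ax - b\|_2^2 \;+\; \lambda\,\bigl(\|Dx\|_1 - \|Dx\|_2\bigr)
\]
% DCA treats the objective as a difference of convex functions and, at iteration k,
% linearizes the concave part -\lambda\|Dx\|_2 at the current iterate x^k
% (with q^k = 0 whenever Dx^k = 0):
\[
  q^{k} = \frac{D x^{k}}{\|D x^{k}\|_2}, \qquad
  x^{k+1} \in \arg\min_{x \ge 0} \;\; \tfrac{1}{2}\,\|Ax - b\|_2^2 \;+\; \lambda\,\|Dx\|_1 \;-\; \lambda\,\langle q^{k},\, D x \rangle
\]
% Each DCA subproblem is convex (L1-regularized least squares with a linear correction term).

Since only the subtracted \|Dx\|_2 term is responsible for the nonconvexity, every DCA iteration reduces to a standard convex L1-type problem, which is presumably where the ADMM solver mentioned in the abstract comes in.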


Similar Articles

Low-dose CT reconstruction via L1 dictionary learning regularization using iteratively reweighted least-squares

BACKGROUND: In order to reduce the radiation dose of CT (computed tomography), compressed sensing theory has been a hot topic, since it provides the possibility of high-quality recovery from sparse sampling data. Recently, an algorithm based on DL (dictionary learning) was developed to deal with the sparse CT reconstruction problem. However, the existing DL algorithm focuses on the minimiz...

Fast and Robust Reconstruction for Fluorescence Molecular Tomography via L1-2 Regularization

Sparse reconstruction inspired by compressed sensing has attracted considerable attention in fluorescence molecular tomography (FMT). However, the columns of the system matrix used for FMT reconstruction tend to be highly coherent, which means L1 minimization may not produce the sparsest solution. In this paper, we propose a novel reconstruction method by minimization of the difference of L1 and L2...

A Smoothing Descent Method for Nonconvex TV-Models

A novel class of variational models with nonconvex ℓq-norm-type regularizations (0 < q < 1) is considered, which typically outperforms those models with convex regularizations in restoring sparse images. Due to the fact that the objective function is nonconvex and non-Lipschitz, such nonconvex models are very challenging from an analytical as well as numerical point of view. In this work we prop...

Nonconvex optimization for improved exploitation of gradient sparsity in CT image reconstruction

A nonconvex optimization algorithm is developed, which exploits gradient magnitude image (GMI) sparsity for reduction in the projection view angle sampling rate. The algorithm shows greater potential for exploiting GMI sparsity than can be obtained by convex total variation (TV) based optimization. The nonconvex algorithm is demonstrated in simulation with ideal, noiseless data for a 2D fan-bea...

A Smoothing Descent Method for Nonconvex TV^q-Models

A novel class of variational models with nonconvex ℓq-norm-type regularizations (0 < q < 1) is considered, which typically outperforms popular models with convex regularizations in restoring sparse images. Due to the fact that the objective function is nonconvex and non-Lipschitz, such models are very challenging from an analytical as well as numerical point of view. In this work a smoothing desc...

Publication date: 2017